Cauchy–Schwarz inequality


In mathematics, the Cauchy–Schwarz inequality, also known as the Schwarz inequality, the Cauchy inequality, or the Cauchy–Bunyakovsky–Schwarz inequality, is a useful inequality encountered in many different settings, such as linear algebra applied to vectors, in analysis applied to infinite series and integration of products, and in probability theory, applied to variances and covariances.

The inequality for sums was published by Augustin Cauchy (1821), while the corresponding inequality for integrals was first stated by Viktor Yakovlevich Bunyakovsky (1859) and rediscovered by Hermann Amandus Schwarz (1888) (often misspelled "Schwartz").

Statement of the inequality

The Cauchy–Schwarz inequality states that for all vectors x and y of a real or complex inner product space,

|\langle x,y\rangle|^2 \leq \langle x,x\rangle \cdot \langle y,y\rangle,

where \langle\cdot,\cdot\rangle is the inner product. Equivalently, by taking the square root of both sides, and referring to the norms of the vectors, the inequality is written as

 |\langle x,y\rangle| \leq \|x\| \cdot \|y\|.\,

Moreover, the two sides are equal if and only if x and y are linearly dependent (or, in a geometrical sense, they are parallel or one of the vectors is equal to zero).

If x_1,\ldots, x_n\in\mathbb C and y_1,\ldots, y_n\in\mathbb C are the components of x and y with respect to an orthonormal basis of the inner product space, the inequality may be restated more explicitly as follows:

|\overline{x}_1 y_1 + \cdots + \overline{x}_n y_n|^2 \leq (|x_1|^2 + \cdots + |x_n|^2) (|y_1|^2 + \cdots + |y_n|^2).

Equality holds if and only if either x=0, or there exists a scalar \lambda such that

 y_1 = \lambda x_1, \ y_2 = \lambda x_2, \dots, y_n = \lambda x_n.
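
As a quick numerical illustration (not part of the original statement), the component form can be checked directly. The following minimal Python sketch assumes NumPy and uses numpy.vdot, which conjugates its first argument, matching the convention above.

import numpy as np

rng = np.random.default_rng(0)
# two random vectors in C^4
x = rng.standard_normal(4) + 1j * rng.standard_normal(4)
y = rng.standard_normal(4) + 1j * rng.standard_normal(4)

# |conj(x_1) y_1 + ... + conj(x_n) y_n|^2  (np.vdot conjugates its first argument)
lhs = abs(np.vdot(x, y)) ** 2
# (|x_1|^2 + ... + |x_n|^2)(|y_1|^2 + ... + |y_n|^2)
rhs = np.sum(np.abs(x) ** 2) * np.sum(np.abs(y) ** 2)
assert lhs <= rhs

# equality when y is a scalar multiple of x
y_dep = (2 - 3j) * x
assert np.isclose(abs(np.vdot(x, y_dep)) ** 2,
                  np.sum(np.abs(x) ** 2) * np.sum(np.abs(y_dep) ** 2))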

The finite-dimensional case of this inequality for real vectors was proved by Cauchy in 1821, and in 1859 Cauchy's student V. Ya. Bunyakovsky noted that by taking limits one can obtain an integral form of Cauchy's inequality. The general result for an inner product space was obtained by K. H. A. Schwarz in 1885.

Proof

As the inequality is trivially true in the case y = 0, we may assume \langle y,y\rangle is nonzero. Let \lambda be a complex number. Then,

 0 \leq \left\| x-\lambda y \right\|^2
= \langle x-\lambda y,x-\lambda y \rangle = \langle x,x \rangle - \bar{\lambda} \langle x,y \rangle - \lambda \langle y,x \rangle + |\lambda|^2 \langle y,y\rangle.

Choosing

 \lambda = \langle x,y \rangle \cdot \langle y,y \rangle^{-1}

we obtain

 0 \leq \langle x,x \rangle - |\langle x,y \rangle|^2 \cdot \langle y,y \rangle^{-1}

which is true if and only if

 |\langle x,y \rangle|^2 \leq \langle x,x \rangle \cdot \langle y,y \rangle

or equivalently:

 \big| \langle x,y \rangle \big|
\leq \left\|x\right\| \left\|y\right\|,

which is the Cauchy–Schwarz inequality.
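
The key step, that this choice of \lambda makes \|x-\lambda y\|^2 equal to \langle x,x\rangle - |\langle x,y\rangle|^2 \cdot \langle y,y\rangle^{-1}, can also be verified numerically. Below is a minimal sketch assuming NumPy; because numpy.vdot conjugates its first argument, the minimizing scalar is written as vdot(y, x)/vdot(y, y) in that convention.

import numpy as np

rng = np.random.default_rng(1)
x = rng.standard_normal(5) + 1j * rng.standard_normal(5)
y = rng.standard_normal(5) + 1j * rng.standard_normal(5)

lam = np.vdot(y, x) / np.vdot(y, y)          # minimizer of ||x - lam*y||^2
r = x - lam * y
lhs = np.vdot(r, r).real                     # ||x - lam*y||^2 >= 0
rhs = np.vdot(x, x).real - abs(np.vdot(x, y)) ** 2 / np.vdot(y, y).real
assert lhs >= 0 and np.isclose(lhs, rhs)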

Notable special cases

Rn

In Euclidean space Rn with the standard inner product, the Cauchy–Schwarz inequality is

\left(\sum_{i=1}^n x_i y_i\right)^2\leq \left(\sum_{i=1}^n x_i^2\right) \left(\sum_{i=1}^n y_i^2\right).

In this special case, an alternative proof is as follows: Consider the polynomial in z

(x_1 z + y_1)^2 + \cdots + (x_n z + y_n)^2.

Note that the polynomial is quadratic in z. Since the polynomial is nonnegative, it has no real roots unless all the ratios x_i/y_i are equal, in which case it has a single (double) root. Hence its discriminant is less than or equal to zero, that is,

\left(\sum ( x_i \cdot y_i ) \right)^2 - \sum {x_i^2} \cdot \sum {y_i^2} \le 0,

which yields the Cauchy–Schwarz inequality.
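
The discriminant argument is easy to test numerically; the sketch below (assuming NumPy) forms the coefficients of the quadratic and checks that its discriminant is non-positive.

import numpy as np

rng = np.random.default_rng(2)
x = rng.standard_normal(6)
y = rng.standard_normal(6)

# p(z) = sum_i (x_i z + y_i)^2 = a z^2 + b z + c
a = np.sum(x ** 2)
b = 2 * np.sum(x * y)
c = np.sum(y ** 2)
# discriminant b^2 - 4ac = 4[(sum x_i y_i)^2 - (sum x_i^2)(sum y_i^2)] <= 0
assert b ** 2 - 4 * a * c <= 0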

An equivalent proof for Rn starts from the summation below. Expanding the brackets, we have:

 \sum_{i=1}^n \sum_{j=1}^n \left( x_i y_j - x_j y_i \right)^2 

= \sum_{i=1}^n x_i^2 \sum_{j=1}^n y_j^2 + \sum_{j=1}^n x_j^2 \sum_{i=1}^n y_i^2 
- 2 \sum_{i=1}^n x_i y_i \sum_{j=1}^n x_j y_j ,

collecting together identical terms (albeit with different summation indices), we find:

 \frac{1}{2} \sum_{i=1}^n \sum_{j=1}^n \left( x_i y_j - x_j y_i \right)^2 

= \sum_{i=1}^n x_i^2 \sum_{i=1}^n y_i^2 - \left( \sum_{i=1}^n x_i y_i \right)^2 .

Because the left-hand side of the equation is a sum of squares of real numbers, it is greater than or equal to zero; thus

\sum_{i=1}^n x_i^2 \sum_{i=1}^n y_i^2 - \left( \sum_{i=1}^n x_i y_i \right)^2 \geq 0.
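
Both the identity and the resulting inequality can be checked numerically. A minimal sketch, assuming NumPy:

import numpy as np

rng = np.random.default_rng(3)
x = rng.standard_normal(7)
y = rng.standard_normal(7)

# (1/2) sum_{i,j} (x_i y_j - x_j y_i)^2
lhs = 0.5 * np.sum((np.outer(x, y) - np.outer(y, x)) ** 2)
# sum x_i^2 * sum y_i^2 - (sum x_i y_i)^2
rhs = np.sum(x ** 2) * np.sum(y ** 2) - np.sum(x * y) ** 2
assert np.isclose(lhs, rhs) and lhs >= 0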

Also, when n = 2 or 3, the dot product is related to the angle between two vectors and one can immediately see the inequality:

|x \cdot y| = \|x\| \|y\| | \cos \theta | \le \|x\| \|y\|.

Furthermore, in this case the Cauchy–Schwarz inequality can also be deduced from Lagrange's identity. For n = 3, Lagrange's identity takes the form

\langle x,x\rangle \cdot \langle y,y\rangle = |\langle x,y\rangle|^2 + |x \times y|^2

from which readily follows the Cauchy–Schwarz inequality.
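
A quick numerical check of Lagrange's identity for n = 3, and of the inequality it implies, can be done as follows (a sketch assuming NumPy):

import numpy as np

rng = np.random.default_rng(4)
x = rng.standard_normal(3)
y = rng.standard_normal(3)

cross = np.cross(x, y)
# Lagrange's identity: <x,x><y,y> = <x,y>^2 + |x x y|^2
assert np.isclose(np.dot(x, x) * np.dot(y, y),
                  np.dot(x, y) ** 2 + np.dot(cross, cross))
# Cauchy-Schwarz follows since |x x y|^2 >= 0
assert np.dot(x, y) ** 2 <= np.dot(x, x) * np.dot(y, y)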

L2

For the inner product space of square-integrable complex-valued functions, one has

\left|\int f(x) \overline{g}(x)\,dx\right|^2\leq\int \left|f(x)\right|^2\,dx \cdot \int\left|g(x)\right|^2\,dx.

A generalization of this is the Hölder inequality.
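
The integral form can be illustrated by replacing each integral with a Riemann sum on a fine grid; the discrete sums themselves satisfy the Cauchy–Schwarz inequality exactly. A minimal sketch, assuming NumPy and two arbitrarily chosen functions on [0, 1]:

import numpy as np

t = np.linspace(0.0, 1.0, 2001)
dt = t[1] - t[0]
f = np.exp(2j * np.pi * t)               # square-integrable complex functions on [0, 1]
g = t ** 2 + 1j * np.sin(3 * t)

lhs = abs(np.sum(f * np.conj(g)) * dt) ** 2          # |integral of f conj(g)|^2
rhs = np.sum(np.abs(f) ** 2) * dt * np.sum(np.abs(g) ** 2) * dt
assert lhs <= rhs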

Use

The triangle inequality for the inner product is often shown as a consequence of the Cauchy–Schwarz inequality, as follows: given vectors x and y,

\|x + y\|^2 = \langle x + y, x + y \rangle
= \|x\|^2 + \langle x, y \rangle + \langle y, x \rangle + \|y\|^2
\le \|x\|^2 + 2|\langle x, y \rangle| + \|y\|^2
\le \|x\|^2 + 2\|x\|\|y\| + \|y\|^2
= \left(\|x\| + \|y\|\right)^2.

Taking the square roots gives the triangle inequality.
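
A direct numerical check of the triangle inequality obtained this way (a sketch assuming NumPy):

import numpy as np

rng = np.random.default_rng(5)
x = rng.standard_normal(4) + 1j * rng.standard_normal(4)
y = rng.standard_normal(4) + 1j * rng.standard_normal(4)

assert np.linalg.norm(x + y) <= np.linalg.norm(x) + np.linalg.norm(y)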

The Cauchy–Schwarz inequality allows one to extend the notion of "angle between two vectors" to any real inner product space, by defining:


\cos\theta_{xy}=\frac{\langle x,y\rangle}{\|x\| \|y\|}

The Cauchy–Schwarz inequality proves that this definition is sensible, by showing that the right-hand side lies in the interval [-1, 1], and it justifies the notion that real inner product spaces are generalizations of Euclidean space.
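
In R^n this definition can be exercised directly; the following minimal sketch (assuming NumPy) computes the angle between two arbitrary vectors and confirms that the cosine lies in [-1, 1].

import numpy as np

rng = np.random.default_rng(6)
x = rng.standard_normal(5)
y = rng.standard_normal(5)

cos_theta = np.dot(x, y) / (np.linalg.norm(x) * np.linalg.norm(y))
assert -1.0 <= cos_theta <= 1.0          # guaranteed by Cauchy-Schwarz
theta = np.arccos(cos_theta)             # a well-defined angle in [0, pi]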

The Cauchy–Schwarz inequality is used to prove that the inner product is a continuous function with respect to the topology induced by the inner product itself.

The Cauchy–Schwarz inequality is usually used to show Bessel's inequality.

The general formulation of the Heisenberg uncertainty principle is derived using the Cauchy–Schwarz inequality in the inner product space of physical wave functions.

Generalizations

Various generalizations of the Cauchy–Schwarz inequality exist in the context of operator theory, e.g. for operator-convex functions and operator algebras, where the domain and/or range of a positive map φ are replaced by a C*-algebra or W*-algebra.

This section lists a few such inequalities from the operator-algebra setting, to give a flavor of results of this type.

Positive functionals on C*- and W*-algebras

One can discuss inner products as positive functionals. Given a Hilbert space L2(m), m being a finite measure, the inner product < · , · > gives rise to a positive functional φ by

\phi (g) = \langle g, 1 \rangle.

Since < f, f > ≥ 0, we have φ(f*f) ≥ 0 for all f in L2(m), where f* is the pointwise conjugate of f. So φ is positive. Conversely, every positive functional φ gives a corresponding inner product < f, g >φ = φ(g*f). In this language, the Cauchy–Schwarz inequality becomes

| \phi(g^*f) |^2 \leq \phi(f^*f) \phi(g^*g) ,

which extends verbatim to positive functionals on C*-algebras.
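
For a concrete finite-dimensional illustration (an assumption made only for this sketch), take m to be a measure on finitely many points, so that L2(m) is C^n with a weighted inner product; the following Python sketch, assuming NumPy, checks the functional form of the inequality.

import numpy as np

rng = np.random.default_rng(7)
m = rng.random(5) + 0.1                    # weights of a finite measure on 5 points
f = rng.standard_normal(5) + 1j * rng.standard_normal(5)
g = rng.standard_normal(5) + 1j * rng.standard_normal(5)

def phi(h):                                # phi(h) = <h, 1> = integral of h dm
    return np.sum(h * m)

lhs = abs(phi(np.conj(g) * f)) ** 2                          # |phi(g* f)|^2
rhs = phi(np.conj(f) * f).real * phi(np.conj(g) * g).real    # phi(f* f) phi(g* g)
assert lhs <= rhs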

We now give an operator theoretic proof for the Cauchy–Schwarz inequality which passes to the C*-algebra setting. One can see from the proof that the Cauchy–Schwarz inequality is a consequence of the positivity and anti-symmetry inner-product axioms.

Consider the positive matrix


M =
\begin{bmatrix}
f^*\\
g^*
\end{bmatrix}
\begin{bmatrix}
f & g
\end{bmatrix}
=
\begin{bmatrix}
f^*f & f^* g \\
g^*f & g^*g
\end{bmatrix}.

Since φ is a positive linear map whose range, the complex numbers C, is a commutative C*-algebra, φ is completely positive. Therefore


M' = (I_2 \otimes \phi)(M) =
\begin{bmatrix}
\phi(f^*f) & \phi(f^* g) \\
\phi(g^*f) & \phi(g^*g)
\end{bmatrix}

is a positive 2 × 2 scalar matrix, which implies it has non-negative determinant:


\phi(f^*f) \phi(g^*g) - | \phi(g^*f) |^2 \geq 0 \quad \mbox{i.e.} \quad \phi(f^*f) \phi(g^*g) \geq | \phi(g^*f) |^2.

This is precisely the Cauchy–Schwarz inequality. If f and g are elements of a C*-algebra, f* and g* denote their respective adjoints.
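
In the same finite-dimensional setting as the sketch above, one can watch the argument numerically: form the 2 × 2 matrix M' and confirm that it is positive semidefinite, so its determinant is non-negative. A minimal sketch, assuming NumPy:

import numpy as np

rng = np.random.default_rng(8)
m = rng.random(4) + 0.1                                  # finite measure weights
f = rng.standard_normal(4) + 1j * rng.standard_normal(4)
g = rng.standard_normal(4) + 1j * rng.standard_normal(4)
phi = lambda h: np.sum(h * m)                            # phi(h) = integral of h dm

M_prime = np.array([[phi(np.conj(f) * f), phi(np.conj(f) * g)],
                    [phi(np.conj(g) * f), phi(np.conj(g) * g)]])
assert np.all(np.linalg.eigvalsh(M_prime) >= -1e-12)     # M' is positive semidefinite
assert np.linalg.det(M_prime).real >= -1e-12             # hence det >= 0: Cauchy-Schwarz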

We can also deduce from the above that every positive linear functional is bounded, corresponding to the fact that the inner product is jointly continuous.

Positive maps

Positive functionals are special cases of positive maps. A linear map Φ between C*-algebras is said to be a positive map if a ≥ 0 implies Φ(a) ≥ 0. It is natural to ask whether inequalities of Schwarz-type exist for positive maps. In this more general setting, usually additional assumptions are needed to obtain such results.

Kadison's inequality

One such inequality is the following:

Theorem If Φ is a unital positive map, then for every normal element a in its domain, we have Φ(a*a) ≥ Φ(a*)Φ(a) and Φ(a*a) ≥ Φ(a)Φ(a*).

This extends the fact that φ(a*a) · 1 ≥ φ(a)*φ(a) = |φ(a)|², when φ is a linear functional.

The case when a is self-adjoint, i.e. a = a*, is known as Kadison's inequality.
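
As a numerical illustration (the choice of map here is an assumption for the sketch, not taken from the text), let Φ be the map sending a matrix to its diagonal part, which is unital and positive, and let a be a normal matrix; the sketch below, assuming NumPy, checks that Φ(a*a) − Φ(a*)Φ(a) is positive semidefinite.

import numpy as np

rng = np.random.default_rng(9)
Phi = lambda A: np.diag(np.diag(A))      # conditional expectation onto the diagonal: unital, positive

# build a normal matrix a = U diag(d) U* with U unitary
U, _ = np.linalg.qr(rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4)))
d = rng.standard_normal(4) + 1j * rng.standard_normal(4)
a = U @ np.diag(d) @ U.conj().T

diff = Phi(a.conj().T @ a) - Phi(a.conj().T) @ Phi(a)
assert np.all(np.linalg.eigvalsh(diff) >= -1e-10)        # Phi(a*a) >= Phi(a*)Phi(a)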

2-positive maps

When Φ is 2-positive, a stronger assumption than merely positive, one has something that looks very similar to the original Cauchy–Schwarz inequality:

Theorem (Modified Schwarz inequality for 2-positive maps) For a 2-positive map Φ between C*-algebras, for all a, b in its domain,

i) Φ(a)*Φ(a) ≤ ||Φ(1)|| Φ(a*a).
ii) ||Φ(a*b)||² ≤ ||Φ(a*a)|| · ||Φ(b*b)||.

A simple argument for ii) is as follows. Consider the positive matrix


M= 
\begin{bmatrix}
a^* & 0 \\
b^* & 0
\end{bmatrix}
\begin{bmatrix}
a & b \\
0 & 0
\end{bmatrix}
=
\begin{bmatrix}
a^*a & a^* b \\
b^*a & b^*b
\end{bmatrix}.

By 2-positivity of Φ,


(I_2 \otimes \Phi) M = 
\begin{bmatrix}
\Phi(a^*a) & \Phi(a^* b) \\
\Phi(b^*a) & \Phi(b^*b)
\end{bmatrix}

is positive. The desired inequality then follows from the properties of positive 2 × 2 (operator) matrices.

Part i) is analogous. One can replace the matrix \begin{bmatrix} a & b \\ 0 & 0 \end{bmatrix} by \begin{bmatrix} 1 & a \\ 0 & 0 \end{bmatrix}.
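
For a concrete instance (chosen here only for illustration), the map Φ(x) = V* x V with a fixed matrix V is completely positive, hence 2-positive, and inequality ii) of the theorem above can be checked numerically with the operator norm. A minimal sketch, assuming NumPy:

import numpy as np

rng = np.random.default_rng(10)
n = 4
V = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
Phi = lambda x: V.conj().T @ x @ V                 # completely positive, hence 2-positive

a = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
b = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))

op_norm = lambda x: np.linalg.norm(x, 2)           # operator (spectral) norm
lhs = op_norm(Phi(a.conj().T @ b)) ** 2            # ||Phi(a*b)||^2
rhs = op_norm(Phi(a.conj().T @ a)) * op_norm(Phi(b.conj().T @ b))
assert lhs <= rhs + 1e-9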
